chore(beep boop 🤖): Bump uv.lock (main, mcore-dev) (2026-04-14)#3319
Closed
svcnvidia-nemo-ci wants to merge 5 commits into main from
Conversation
Signed-off-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Contributor
Author
/ok to test 4fa68c2
Contributor
No actionable comments were generated in the recent review. 🎉

⛔ Files ignored due to path filters (1)
📒 Files selected for processing (2)
📝 Walkthrough: This PR updates two commit references.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~8 minutes

🚥 Pre-merge checks: ❌ Failed checks (1 warning)
✅ Passed checks (3 passed)
…r and hybrid_context_parallel

state.py: wrap `EnergyMonitor` import in try/except (not yet in mcore dev), add runtime guard before instantiation.
initialize.py: use inspect.signature to conditionally pass `hybrid_context_parallel` kwarg to initialize_model_parallel (only in mcore dev, not yet in main).

Signed-off-by: yaoyu-33 <yaoyu.094@gmail.com>
Made-with: Cursor
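A minimal sketch of the two compatibility patterns this commit describes. The `EnergyMonitor` import path and the wrapper function `init_parallel_state` are assumptions for illustration; only the kwarg name `hybrid_context_parallel` and the use of `inspect.signature` come from the commit message.

```python
import inspect

try:
    # Hypothetical import path: EnergyMonitor is not yet on the mcore dev branch,
    # so the import is guarded and a flag is kept for a runtime check.
    from megatron.core.energy_monitor import EnergyMonitor  # assumed location
    HAVE_ENERGY_MONITOR = True
except ImportError:
    EnergyMonitor = None
    HAVE_ENERGY_MONITOR = False


def init_parallel_state(initialize_model_parallel, hybrid_context_parallel=False, **kwargs):
    """Pass `hybrid_context_parallel` only when the installed mcore accepts it."""
    params = inspect.signature(initialize_model_parallel).parameters
    if "hybrid_context_parallel" in params:
        kwargs["hybrid_context_parallel"] = hybrid_context_parallel
    return initialize_model_parallel(**kwargs)
```

Instantiation sites would then check `HAVE_ENERGY_MONITOR` before constructing the monitor, so the same code imports cleanly on both branches.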
Contributor
/ok to test ce40a56
…nting and state

mcore-dev does not have `async_strategy` param in `dist_checkpointing.save()` or the `get_async_strategy` helper. Use `inspect.signature` to conditionally pass `async_strategy`, and fall back to direct imports from mcore core modules when `get_async_strategy` is unavailable.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Signed-off-by: yaoyu-33 <yaoyu.094@gmail.com>
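The signature probe described above can be sketched as a small wrapper. The wrapper name `save_checkpoint` is hypothetical; the kwarg name `async_strategy` and the `inspect.signature` check follow the commit message.

```python
import inspect


def save_checkpoint(save_fn, state_dict, ckpt_dir, async_strategy=None):
    """Call a dist_checkpointing-style save(), passing `async_strategy`
    only when the installed version's signature accepts it."""
    kwargs = {}
    if async_strategy is not None and "async_strategy" in inspect.signature(save_fn).parameters:
        kwargs["async_strategy"] = async_strategy
    # Older/newer branches that lack the kwarg are called without it.
    return save_fn(state_dict, ckpt_dir, **kwargs)
```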
Contributor
/ok to test be0b8da
…ompat

MCore dev passes `single_grouped_weight` and `single_grouped_bias` to TE `GroupedLinear.__init__()` when TE >= 2.14.0, but some TE 2.14.0 builds only expose a unified `single_grouped_parameter` kwarg. Monkey-patch `GroupedLinear.__init__()` to remap the split kwargs to the unified one when needed.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Signed-off-by: yaoyu-33 <yaoyu.094@gmail.com>
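The monkey-patch shape could look like the sketch below. The kwarg names come from the commit message, but the remapping rule (unified kwarg set when both split kwargs are truthy) is an assumption; the actual TE semantics are not spelled out here.

```python
import inspect


def patch_grouped_linear(cls):
    """If `cls.__init__` only exposes the unified `single_grouped_parameter`
    kwarg, wrap it so callers may still pass the split kwargs."""
    params = inspect.signature(cls.__init__).parameters
    # Only patch builds that have the unified kwarg but not the split ones.
    if "single_grouped_weight" in params or "single_grouped_parameter" not in params:
        return cls

    orig_init = cls.__init__

    def patched_init(self, *args, **kwargs):
        weight = kwargs.pop("single_grouped_weight", None)
        bias = kwargs.pop("single_grouped_bias", None)
        if weight is not None or bias is not None:
            # Assumed remap: unified flag is on when both split flags are on.
            kwargs["single_grouped_parameter"] = bool(weight) and bool(bias)
        orig_init(self, *args, **kwargs)

    cls.__init__ = patched_init
    return cls
```

Because the patch inspects the signature first, it is a no-op on builds that already accept the split kwargs.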
`_fuse_moe_expert_weights` loads tensors via safetensors (memory-mapped), then writes back to the same file path. On some filesystems (e.g. Docker overlay in CI), this causes EFAULT because `save_file` truncates the file while the kept tensors still reference the mmap'd regions. Cloning the kept tensors before saving avoids this.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Signed-off-by: yaoyu-33 <yaoyu.094@gmail.com>
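The clone-before-rewrite pattern can be shown abstractly. The helper below is not `_fuse_moe_expert_weights` itself; it is a sketch with the load/save/clone callables injected (in the real code they would be `safetensors.torch.load_file`, `safetensors.torch.save_file`, and `torch.Tensor.clone`).

```python
def rewrite_safetensors(path, keep_keys, load_fn, save_fn, clone_fn):
    """Drop all tensors except `keep_keys` from the file at `path`.

    Kept tensors are cloned first so they own their memory and no longer
    alias the mmap'd file contents; the subsequent save (which truncates
    the file) then cannot invalidate pages they still reference.
    """
    tensors = load_fn(path)  # mmap-backed in the real safetensors case
    kept = {k: clone_fn(v) for k, v in tensors.items() if k in keep_keys}
    save_fn(kept, path)  # safe: `kept` no longer references the old file
    return kept
```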
Contributor
/ok to test d73e29e
Contributor
Closing stale automated bump PR.
🚀 PR to bump `uv.lock` in `main`.

🤖 This PR will be merged automatically once CI passes.